Near-Threshold L1 Data Cache for Yield Management Under Process Variations

Authors
Abstract


Related articles

Dynamic thermal management for 3D multicore processors under process variations

Stacking core layers is emerging as an alternative for future high-performance computing, but thermal problems have to be tackled first. When adaptive voltage scaling is adopted to hide the growing variation in core performance, the heat generated by each core varies as a result. By exploiting the static thermal characteristics, the efficiency of dynamic thermal management can be improved...
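The idea described above, using per-core static thermal characteristics to guide throttling decisions, can be illustrated with a toy control loop. The sketch below is only a hedged illustration under an assumed, simplified thermal model; the names (`heat_coeff`, `THRESHOLD_C`), constants, and throttling step are illustrative and are not taken from the paper.

```python
# Hypothetical sketch of variation-aware dynamic thermal management (DTM):
# cores with worse static thermal characteristics heat up faster and are
# throttled first. All names and constants are illustrative.

THRESHOLD_C = 85.0   # assumed throttling trigger temperature (deg C)
AMBIENT_C = 45.0

class Core:
    def __init__(self, core_id, heat_coeff):
        self.core_id = core_id
        self.heat_coeff = heat_coeff   # static, per-core (process variation)
        self.freq_ghz = 2.0
        self.temp_c = AMBIENT_C

    def step(self, activity):
        # toy thermal model: temperature tracks activity * frequency,
        # scaled by the core's static heat coefficient
        target = AMBIENT_C + self.heat_coeff * activity * self.freq_ghz * 20.0
        self.temp_c += 0.3 * (target - self.temp_c)

def dtm_step(cores, activities):
    for core, act in zip(cores, activities):
        core.step(act)
    # walk cores from hottest to coolest; throttle any core above the
    # trigger, and restore frequency once it has cooled safely below it
    for core in sorted(cores, key=lambda c: (c.temp_c, c.heat_coeff), reverse=True):
        if core.temp_c > THRESHOLD_C and core.freq_ghz > 1.0:
            core.freq_ghz -= 0.2
        elif core.temp_c < THRESHOLD_C - 5.0 and core.freq_ghz < 2.0:
            core.freq_ghz += 0.2

if __name__ == "__main__":
    cores = [Core(i, coeff) for i, coeff in enumerate([1.0, 1.2, 0.9, 1.3])]
    for _ in range(50):
        dtm_step(cores, activities=[0.9, 0.95, 0.8, 1.0])
    for c in cores:
        print(f"core {c.core_id}: {c.temp_c:.1f} C at {c.freq_ghz:.1f} GHz")
```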


Rethinking Last-Level Cache Management for Multicores Operating at Near-Threshold Voltages

Near-threshold voltage computing promises an order-of-magnitude improvement in energy efficiency, enabling future processors to integrate hundreds of cores running concurrently. However, such low-voltage operation is accompanied by extreme parametric variations, resulting in unreliable operation of the processor. The memory bit-cells in on-chip caches are most vulnerable to failure due to their tight fun...
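A common way to tolerate bit-cell failures at near-threshold voltage is to keep a test-time fault map and allocate only into ways whose cells are reliable at the reduced operating point. The sketch below illustrates that generic idea only, not the scheme proposed in the paper; the class name, set/way geometry, and failure probability are assumptions.

```python
# Illustrative sketch: a cache that consults a per-set fault map and
# allocates only into ways whose SRAM cells work at near-threshold voltage.
import random

NUM_SETS, NUM_WAYS = 8, 4

class FaultAwareCache:
    def __init__(self, fail_prob=0.15, seed=1):
        rng = random.Random(seed)
        # True = way is faulty at the near-threshold operating point
        self.fault_map = [[rng.random() < fail_prob for _ in range(NUM_WAYS)]
                          for _ in range(NUM_SETS)]
        self.tags = [[None] * NUM_WAYS for _ in range(NUM_SETS)]
        self.lru = [list(range(NUM_WAYS)) for _ in range(NUM_SETS)]

    def access(self, addr):
        s, tag = addr % NUM_SETS, addr // NUM_SETS
        for w in range(NUM_WAYS):
            if not self.fault_map[s][w] and self.tags[s][w] == tag:
                self.lru[s].remove(w); self.lru[s].append(w)
                return "hit"
        healthy = [w for w in self.lru[s] if not self.fault_map[s][w]]
        if not healthy:
            return "bypass"       # whole set disabled: forward to memory
        victim = healthy[0]       # least recently used healthy way
        self.tags[s][victim] = tag
        self.lru[s].remove(victim); self.lru[s].append(victim)
        return "miss"

cache = FaultAwareCache()
print([cache.access(a) for a in (3, 11, 3, 19, 3)])
```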


Mitigating Process Variability for Non-Volatile Cache Resilience and Yield

While the inclusion of emerging technology-based Non-Volatile Memory (NVM) devices in on-chip memory subsystems offers excellent potential for energy savings and scalability, their sensing vulnerability creates Process Variation (PV) challenges. This paper presents a circuit-architecture cross-layer solution to realize a radically different approach to leveraging as-built variations via specific Se...


Cache management for shared sequential data access

This paper presents a new set of cache management algorithms for shared data objects that are accessed sequentially. I/O delays on sequentially accessed data are a dominant performance factor in many application domains, in particular for batch processing. Our algorithms fall into three classes: replacement, prefetching, and scheduling strategies. Our replacement algorithms empirically estimate t...
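Because the description of the replacement algorithms is truncated above, the sketch below illustrates only the general idea behind replacement for shared sequential scans: a block that every active scan has already passed will not be re-read and is a safe eviction victim. The names and the fallback rule are assumptions, not the paper's algorithm.

```python
# Hedged sketch of a replacement idea for shared sequential scans.

class SequentialScanCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = set()          # cached block numbers
        self.scan_positions = {}     # scan_id -> last block consumed

    def advance(self, scan_id, block):
        self.scan_positions[scan_id] = block
        if block not in self.blocks:
            if len(self.blocks) >= self.capacity:
                self._evict()
            self.blocks.add(block)

    def _evict(self):
        slowest = min(self.scan_positions.values())
        consumed = [b for b in self.blocks if b < slowest]
        # prefer a block every scan has passed; otherwise drop the block
        # farthest ahead of the slowest scan, since it is needed latest
        victim = max(consumed) if consumed else max(self.blocks)
        self.blocks.discard(victim)

cache = SequentialScanCache(capacity=3)
for scan, blk in [("A", 0), ("B", 0), ("A", 1), ("A", 2), ("B", 1), ("A", 3)]:
    cache.advance(scan, blk)
print(sorted(cache.blocks))   # block 0 was dropped once both scans passed it
```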


L1 Data Cache Power Reduction Using a Forwarding Predictor

In most modern processor designs, the L1 data cache has become a major consumer of power due to its increasing size and high-frequency access rate. In order to reduce this power consumption, we propose in this paper a straightforward filtering technique. The mechanism is based on a highly accurate forwarding predictor that determines whether a load instruction will take its corresponding data via for...
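A forwarding predictor of this kind is often built as a small PC-indexed table of saturating counters: when it predicts that a load will be satisfied by store-to-load forwarding, the L1 data-array lookup can be skipped to save power. The sketch below shows that generic structure only; the table size, indexing, thresholds, and update policy are assumptions rather than the design in the paper.

```python
# Minimal sketch of a PC-indexed forwarding predictor used as a filter in
# front of the L1 data cache (2-bit saturating counters; all parameters
# are assumptions).

TABLE_SIZE = 1024

class ForwardingPredictor:
    def __init__(self):
        self.counters = [0] * TABLE_SIZE   # 0..3 saturating counters

    def _index(self, load_pc):
        return (load_pc >> 2) % TABLE_SIZE

    def predict_forwarded(self, load_pc):
        # >= 2 means "load will be satisfied by store-to-load forwarding",
        # so the L1 data-array access can be skipped
        return self.counters[self._index(load_pc)] >= 2

    def update(self, load_pc, was_forwarded):
        i = self._index(load_pc)
        if was_forwarded:
            self.counters[i] = min(3, self.counters[i] + 1)
        else:
            self.counters[i] = max(0, self.counters[i] - 1)

pred = ForwardingPredictor()
for _ in range(3):
    pred.update(0x400123, was_forwarded=True)   # this load keeps hitting the store queue
print(pred.predict_forwarded(0x400123))          # True: skip the L1 data array
print(pred.predict_forwarded(0x400def))          # False: perform the normal L1 access
```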



Journal

Journal title: IEEE Access

Year: 2020

ISSN: 2169-3536

DOI: 10.1109/access.2020.2968603